
    On The Effect of Hyperedge Weights On Hypergraph Learning

    The hypergraph is a powerful representation for many computer vision, machine learning, and pattern recognition problems. In the last decade, many researchers have been keen to develop different hypergraph models. In contrast, little attention has been paid to the design of hyperedge weights. However, many studies on pairwise graphs show that the choice of edge weights can significantly influence the performance of graph algorithms, and we argue that the same holds for hypergraphs. In this paper, we empirically study the influence of hyperedge weights on hypergraph learning by proposing three novel hyperedge weights from the perspectives of geometry, multivariate statistical analysis, and linear regression. Extensive experiments on the ORL, COIL20, JAFFE, Sheffield, Scene15, and Caltech256 databases verify our hypothesis. As in graph learning, several representative hyperedge weighting schemes emerge from our experimental studies. Moreover, the experiments also demonstrate that combining such weighting schemes with conventional hypergraph models yields very promising classification and clustering performance in comparison with recent state-of-the-art algorithms.
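
    To make the role of hyperedge weights concrete, the following minimal NumPy sketch builds the normalized hypergraph Laplacian of Zhou et al. (2006), a conventional hypergraph model of the kind the abstract refers to. The heat-kernel weight shown is a common illustrative choice and an assumption here, not one of the paper's three proposed schemes.

        import numpy as np

        def hypergraph_laplacian(H, w):
            """Normalized hypergraph Laplacian (Zhou et al., 2006):
            L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.
            H is the |V| x |E| incidence matrix; w holds the hyperedge
            weights -- the quantity whose design the paper studies."""
            W = np.diag(w)
            Dv = H @ w                 # vertex degrees: weighted count of incident hyperedges
            De = H.sum(axis=0)         # hyperedge degrees: vertices per hyperedge
            Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
            Theta = Dv_inv_sqrt @ H @ W @ np.diag(1.0 / De) @ H.T @ Dv_inv_sqrt
            return np.eye(H.shape[0]) - Theta

        # Toy example: 4 vertices, 2 hyperedges; a heat-kernel weight
        # w(e) = exp(-d_e), with d_e an assumed mean pairwise distance in e.
        H = np.array([[1, 0],
                      [1, 0],
                      [1, 1],
                      [0, 1]], dtype=float)
        w = np.exp(-np.array([0.5, 1.2]))
        L = hypergraph_laplacian(H, w)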

    Learning Hypergraph-regularized Attribute Predictors

    We present a novel attribute learning framework named Hypergraph-based Attribute Predictor (HAP). In HAP, a hypergraph is leveraged to depict the attribute relations in the data. The attribute prediction problem is then cast as a regularized hypergraph cut problem in which HAP jointly learns a collection of attribute projections from the feature space to a hypergraph embedding space aligned with the attribute space. The learned projections directly act as attribute classifiers (linear and kernelized). This formulation leads to a very efficient approach. By considering our model as a multi-graph cut task, our framework can flexibly incorporate other available information, in particular class labels. We apply our approach to attribute prediction, zero-shot, and N-shot learning tasks. The results on the AWA, USAA, and CUB databases demonstrate the value of our methods in comparison with state-of-the-art approaches.
    Comment: This is an attribute learning paper accepted by CVPR 2015.
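
    As an illustration of this kind of objective, here is a generic hypergraph-regularized least-squares sketch: it learns a linear projection P mapping features toward attribute labels while a hypergraph Laplacian term smooths predictions along hyperedges. This closed form is a simplified stand-in under assumed regularization weights lam and mu, not the paper's exact joint hypergraph-cut formulation.

        import numpy as np

        def hypergraph_regularized_predictor(X, Y, L, lam=1.0, mu=1e-3):
            """Solve  min_P ||X P - Y||_F^2 + lam * tr(P^T X^T L X P) + mu * ||P||_F^2,
            where X (n x d) are features, Y (n x a) attribute labels, and
            L (n x n) a hypergraph Laplacian. Setting the gradient to zero gives
            (X^T X + lam * X^T L X + mu * I) P = X^T Y."""
            d = X.shape[1]
            A = X.T @ X + lam * (X.T @ L @ X) + mu * np.eye(d)
            return np.linalg.solve(A, X.T @ Y)

        # The columns of P act directly as linear attribute classifiers:
        # attribute scores for a new sample x are x @ P.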

    Intrinsic flat stability of the positive mass theorem for graphical hypersurfaces of Euclidean space

    The rigidity of the Positive Mass Theorem states that the only complete asymptotically flat manifold of nonnegative scalar curvature and zero mass is Euclidean space. We study the stability of this statement for spaces that can be realized as graphical hypersurfaces in Euclidean space. We prove (under certain technical hypotheses) that if a sequence of complete asymptotically flat graphs of nonnegative scalar curvature has mass approaching zero, then the sequence must converge to Euclidean space in the pointed intrinsic flat sense. The appendix includes a new Gromov-Hausdorff and intrinsic flat compactness theorem for sequences of metric spaces with uniform Lipschitz bounds on their metrics.
    Comment: 31 pages, 2 figures, v2: to appear in Crelle's Journal, many minor changes, one new example.
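
    For reference, the standard definitions behind this statement (assumed background, not quoted from the paper): a graphical hypersurface is the graph of a function f : \mathbb{R}^n \to \mathbb{R} in \mathbb{R}^{n+1}, carrying the induced metric

        g_{ij} = \delta_{ij} + \partial_i f \, \partial_j f,

    and the mass in question is the ADM mass

        m_{\mathrm{ADM}}(g) = \lim_{r \to \infty} \frac{1}{2(n-1)\,\omega_{n-1}}
            \int_{S_r} \left( \partial_i g_{ij} - \partial_j g_{ii} \right) \nu^j \, dA,

    where \omega_{n-1} is the volume of the unit (n-1)-sphere, S_r are coordinate spheres, and \nu is the outward unit normal. The result above says that m_{\mathrm{ADM}} \to 0 along the sequence forces pointed intrinsic flat convergence to Euclidean space.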